Dimensional mapping of multimodal integration on audiovisual emotion perception

Authors

  • Yoshiko Arimoto
  • Kazuo Okanoya
Abstract

The aim of this research was to investigate which emotions are perceived when incongruent vocal and facial emotional expressions are presented as an integrated emotional expression. Our approach maps unimodal and bimodal perceptual emotional information onto the dimensions of an emotional space constructed with principal component analysis (PCA). Unimodal perception tests and a bimodal congruent/incongruent perception test were conducted with stimuli in which professional actors expressed four emotions (anger, joy, fear, and sadness) and observers rated the intensity of six emotions (the four expressed emotions plus disgust and surprise) on a six-point scale. A PCA was performed on the scores for each stimulus to create a perceptual emotional space and to compare unimodal perception with bimodal perception. The results showed that some incongruent emotional expressions were perceived as emotions significantly inconsistent with the expressed emotion.
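The dimensional mapping described in the abstract can be sketched as follows. This is a minimal illustration only: the rating matrix is randomly generated, and the number of stimuli and the use of the first two components are assumptions, not the study's actual data or settings.

```python
import numpy as np

# Hypothetical six-point intensity ratings: rows are stimuli, columns the
# six rated emotions (anger, joy, fear, sadness, disgust, surprise), 0-5.
rng = np.random.default_rng(0)
ratings = rng.integers(0, 6, size=(40, 6)).astype(float)

# PCA via SVD of the mean-centered rating matrix.
centered = ratings - ratings.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Coordinates of each stimulus in the first two dimensions of the
# perceptual emotional space.
coords = centered @ Vt[:2].T

# Proportion of rating variance captured by each principal component.
explained = S**2 / np.sum(S**2)

print(coords.shape)  # (40, 2)
```

Unimodal and bimodal stimuli projected into the same space can then be compared by their coordinates along these dimensions.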


Similar resources

Multimodal emotion perception after anterior temporal lobectomy (ATL)

In the context of emotion information processing, several studies have demonstrated the involvement of the amygdala in emotion perception, for unimodal and multimodal stimuli. However, it seems that not only the amygdala, but several regions around it, may also play a major role in multimodal emotional integration. In order to investigate the contribution of these regions to multimodal emotion ...


Dynamic Modeling Approaches for Audiovisual Speech Perception and Multisensory Integration

Multimodal information including auditory, visual and even haptic information is integrated during speech perception. Articulatory information provided by a talker's face enhances speech intelligibility in congruent and temporally coincident signals, and produces a perceptual fusion (e.g. the "McGurk effect") when the auditory and visual signals are incongruent. This paper focuses on promising ...


Assessing agreement of observer- and self-annotations in spontaneous multimodal emotion data

We investigated inter-observer agreement and the reliability of self-reported emotion ratings (i.e., self-raters judging their own emotions) in spontaneous multimodal emotion data. During a multiplayer video game, vocal and facial expressions were recorded (including the game content itself) and were annotated by the players themselves on arousal and valence scales. In a perception experiment, ...
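The chance-corrected agreement this kind of study reports can be illustrated with Cohen's kappa for two raters on a discrete scale. The raters, labels, and data below are entirely hypothetical; the cited study's actual agreement measure is not specified in this excerpt.

```python
import numpy as np

# Hypothetical discrete valence annotations (1-5) from two raters
# over the same ten game clips.
rater_a = np.array([1, 2, 2, 3, 5, 4, 2, 1, 3, 3])
rater_b = np.array([1, 2, 3, 3, 5, 4, 2, 2, 3, 4])

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters on nominal labels."""
    labels = np.union1d(a, b)
    observed = np.mean(a == b)
    # Agreement expected if both raters labeled independently
    # according to their own marginal label frequencies.
    expected = sum(np.mean(a == l) * np.mean(b == l) for l in labels)
    return (observed - expected) / (1 - expected)

print(round(cohens_kappa(rater_a, rater_b), 3))
```

A kappa near 1 indicates strong agreement beyond chance; values near 0 indicate agreement no better than chance.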


Emotional sounds modulate early neural processing of emotional pictures

In our natural environment, emotional information is conveyed by converging visual and auditory information; multimodal integration is of utmost importance. In the laboratory, however, emotion researchers have mostly focused on the examination of unimodal stimuli. Few existing studies on multimodal emotion processing have focused on human communication such as the integration of facial and voca...


Crossmodal integration enhances neural representation of task-relevant features in audiovisual face perception.

Previous studies have shown that audiovisual integration improves identification performance and enhances neural activity in heteromodal brain areas, for example, the posterior superior temporal sulcus/middle temporal gyrus (pSTS/MTG). Furthermore, it has also been demonstrated that attention plays an important role in crossmodal integration. In this study, we considered crossmodal integration ...



Journal title:

Volume   Issue 

Pages  -

Publication date: 2011